The main obstacles to better research data management and sharing are cultural. But change is in our hands
Recommendations on how to better support researchers in good data management and sharing practices are typically focused on developing new tools or improving infrastructure. Yet research shows the most common obstacles are actually cultural, not technological. Marta Teperek and Alastair Dunning outline how appointing data stewards and data champions can be key to improving research data management through positive cultural change
Research data should be available long-term...but who is going to pay?
There is now a broad consensus that sharing and preserving data makes research more efficient, reproducible and potentially innovative. As such, most funding bodies now require research data to be stored, preserved, and made available long-term. But who is going to pay for this to happen? Marta Teperek and Alastair Dunning outline how the costs of long-term data preservation are not eligible for inclusion as part of any funding body’s grants. Neither is it currently realistic for these costs to be absorbed by research institutions. With discussions between funding bodies and institutions yet to bear fruit, perhaps it is time for joined-up national (or international) efforts on data preservation
Are the FAIR Data Principles fair?
This practice paper describes an ongoing research project to test the effectiveness and relevance of the FAIR Data Principles, while simultaneously analysing how easy it is for data archives to adhere to them. The research took place from November 2016 to January 2017 and is underpinned by feedback from the repositories.
The FAIR Data Principles feature 15 facets corresponding to the four letters of FAIR - Findable, Accessible, Interoperable, Reusable. These principles have already gained traction within the research world. The European Commission has recently expanded its demand for research to produce open data. The relevant guidelines (1) are explicitly written in the context of the FAIR Data Principles. Given that an increasing number of researchers will have exposure to the guidelines, understanding their viability, and suggesting where there may be room for modification and adjustment, is of vital importance.
This practice paper is connected to a dataset (Dunning et al., 2017) containing the original overview of the sample group statistics and graphs in an Excel spreadsheet. Over the course of two months, the web interfaces, help pages and metadata records of over 40 data repositories were examined to score each data repository against the FAIR principles and facets. A traffic-light rating system enables colour-coding according to compliance and vagueness. The statistical analysis provides overall and categorised results, as well as results focussing on the principles and on the facets.
The analysis includes a statistical and descriptive evaluation, followed by elaborations on elements of the FAIR Data Principles, on subject-specific and repository-specific differences, and on what repositories can do to improve their information architecture.
(1) H2020 Guidelines on FAIR Data Management: http://ec.europa.eu/research/participants/data/ref/h2020/grants_manual/hi/oa_pilot/h2020-hi-oa-data-mgt_en.pd
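The traffic-light scoring described above can be sketched in a few lines. The facet labels follow the standard FAIR wording, but the numeric weights, the aggregation into a single score, and the example ratings are illustrative assumptions, not the authors' actual rubric.

```python
# Illustrative sketch of traffic-light scoring of a repository against
# the 15 FAIR facets. Weights and example ratings are hypothetical.

FACETS = ["F1", "F2", "F3", "F4",          # Findable
          "A1", "A1.1", "A1.2", "A2",      # Accessible
          "I1", "I2", "I3",                # Interoperable
          "R1", "R1.1", "R1.2", "R1.3"]    # Reusable

# green = compliant, amber = vague/partial, red = non-compliant
SCORES = {"green": 1.0, "amber": 0.5, "red": 0.0}

def repository_score(ratings):
    """Aggregate one repository's per-facet ratings into a 0-1 score."""
    return sum(SCORES[ratings[f]] for f in FACETS) / len(FACETS)

ratings = {f: "green" for f in FACETS}
ratings["A2"] = "amber"   # e.g. metadata-persistence policy is vague
ratings["I3"] = "red"     # e.g. no qualified references to other data
print(round(repository_score(ratings), 2))  # prints 0.9
```

A per-category breakdown (Findable, Accessible, etc.) follows the same pattern by summing over each letter's facets separately, which is how the categorised results mentioned above could be produced.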
Data Stewardship Addressing Disciplinary Data Management Needs
One of the biggest challenges for multidisciplinary research institutions which provide data management support to researchers is addressing disciplinary differences (Akers and Doty, 2013). Centralised services need to be general enough to cater for all the different flavours of research conducted in an institution. At the same time, focusing on the common denominator means that subject-specific differences and needs may not be effectively addressed. In 2017, Delft University of Technology (TU Delft) embarked on an ambitious Data Stewardship project, aiming to comprehensively address data management needs across a multidisciplinary campus. In this article, we describe the principles behind the Data Stewardship project at TU Delft and the progress so far, identify the key challenges, and explain our plans for the future.
From Passive to Active, From Generic to Focussed: How Can an Institutional Data Archive Remain Relevant in a Rapidly Evolving Landscape?
Founded in 2008 as an initiative of the libraries of three of the four technical universities in the Netherlands, the 4TU.Centre for Research Data (4TU.Research Data) has provided a fully operational, cross-institutional, long-term archive since 2010, storing data from all subjects in applied sciences and engineering. Presently, over 90% of the data in the archive is geoscientific data coded in netCDF (Network Common Data Form) – a data format and data model that, although generic, is mostly used in climate, ocean and atmospheric sciences. In this practice paper, we explore the question of how 4TU.Research Data can stay relevant and forward-looking in a rapidly evolving research data management landscape. In particular, we describe the motivation behind this question and how we propose to address it
Advancing Research Data Management in Universities of Science and Technology
The white paper ‘Advancing Research Data Management in Universities of Science and Technology’ shares insights on the state of the art in research data management and recommendations for advancement.
A core part of the paper is the results of a survey, distributed to our member institutions in 2019, which addressed the following aspects of research data management (RDM): (i) the establishment of an RDM policy at the university; (ii) the provision of suitable RDM infrastructure and tools; and (iii) the establishment of RDM support services and training tailored to the requirements of science and technology disciplines.
The paper reveals that while substantial progress has been made, there is still a long way to go when it comes to establishing “advanced-degree programmes at our major universities for the emerging field of data scientist”, as recommended in the seminal 2010 report ‘Riding the Wave’, and our white paper offers concrete recommendations and best practices for university leaders, researchers, operational staff, and policy makers.
The topic of RDM has become a focal point in many scientific disciplines, in Europe and globally. The management and full utilisation of research data are now also at the top of the European agenda, as exemplified by Ursula von der Leyen's address at this year's World Economic Forum. However, the implementation of RDM remains divergent across Europe.
The white paper was written by a diverse team of RDM specialists, including data scientists and data stewards, with the work led by the RDM subgroup of our Task Force Open Science. The writing team included Angelina Kraft (Head of Lab Research Data Services at TIB, Leibniz University Hannover) who said: “The launch of RDM courses and teaching materials at universities of science and technology is a first important step to motivate people to manage their data. Furthermore, professors and PIs of all disciplines should actively support data management and motivate PhD students to publish their data in recognised digital repositories.”
Also on the writing team were Barbara Sanchez (Head of Centre for Research Data Management, TU Wien) and Malgorzata Goraczek (International Research Support / Data Management Support, TU Wien), who added: “A reliable research data infrastructure is a central component of any RDM service. In addition to the infrastructure, proper RDM is all about communication and cooperation. This includes bringing tools, infrastructures, staff and units together.”
Alastair Dunning (Head of 4TU.ResearchData, Delft University of Technology), also one of the writers, added: “There is a popular misconception that better research data management only means faster and more efficient computers. In this white paper, we emphasise the role that training and a culture of good research data management must play.”
Beyond Infrastructure -- Modelling Scholarly Research and Collaboration
This paper explores what is needed to foster an acceptance of digital practices in the humanities beyond the creation of pure infrastructure, specifically in terms of understanding and technically modelling traditional scholarly research within a digital medium, while enabling new modes of scholarly work that could only be carried out within a digitally-mediated environment.
GDPR in research - what does it mean for research institutions?
Collection of materials from the event "GDPR in research - what does it mean for research institutions?", hosted by TU Delft Library on 30 August 2018. The collection includes the following materials: the programme of the event, the welcome slide, and all presentations from the event. All authors and event organisers are listed in alphabetical order. Any questions about these materials should be addressed to [email protected]